Sequence Learning with Recurrent Networks: Analysis of Internal Representations

Authors

  • Joydeep Ghosh
  • Vijay Karamcheti
Abstract

The recognition and learning of temporal sequences is fundamental to cognitive processing. Several recurrent networks attempt to encode past history through feedback connections from "context units". However, the internal representations formed by these networks are not well understood. In this paper, we use cluster analysis to interpret the hidden unit encodings formed when a network with context units is trained to recognize strings from a finite state machine. If the number of hidden units is small, the network forms fuzzy representations of the underlying machine states. With more hidden units, different representations may evolve for alternative paths to the same state. Thus, appropriate network size is indicated by the complexity of the underlying finite state machine. The analysis of internal representations can be used to model an unknown system based on observation of its output sequences.
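The analysis pipeline the abstract outlines can be illustrated concretely. The sketch below is not the authors' code: it trains a small Elman-style network (context units implemented as a copy of the previous hidden layer, a standard construction) to predict the next symbol of strings generated by a toy three-state finite state machine, then applies hierarchical cluster analysis to the hidden unit activations. The FSM, architecture, and all hyperparameters are assumptions chosen for brevity.

```python
# Minimal sketch of the analysis described above (not the authors' code):
# train an Elman-style network with context units on strings from a small
# finite state machine, then cluster its hidden unit activations.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)

# Toy 3-state FSM over two symbols: transition[state][symbol] -> next state.
N_STATES, N_SYMS = 3, 2
transition = {0: {0: 1, 1: 2}, 1: {0: 0, 1: 2}, 2: {0: 2, 1: 0}}

def sample_string(length=20):
    state, syms, states = 0, [], []
    for _ in range(length):
        s = int(rng.integers(N_SYMS))
        syms.append(s)
        state = transition[state][s]
        states.append(state)              # FSM state after reading the symbol
    return syms, states

def one_hot(i, n):
    v = np.zeros(n)
    v[i] = 1.0
    return v

H, LR = 8, 0.1                            # hidden units, learning rate
Wxh = rng.normal(0, 0.3, (N_SYMS, H))     # input -> hidden
Wch = rng.normal(0, 0.3, (H, H))          # context -> hidden
Why = rng.normal(0, 0.3, (H, N_SYMS))     # hidden -> output

for epoch in range(300):                  # Elman-style training: the context
    syms, _ = sample_string()             # is a copy of the previous hidden
    context = np.zeros(H)                 # layer, and gradients stop there.
    for t in range(len(syms) - 1):
        x = one_hot(syms[t], N_SYMS)
        h = np.tanh(x @ Wxh + context @ Wch)
        z = h @ Why
        p = np.exp(z - z.max()); p /= p.sum()      # softmax over next symbol
        dy = p - one_hot(syms[t + 1], N_SYMS)      # cross-entropy gradient
        dh = (dy @ Why.T) * (1.0 - h * h)
        Why -= LR * np.outer(h, dy)
        Wxh -= LR * np.outer(x, dh)
        Wch -= LR * np.outer(context, dh)
        context = h                                 # copy-back context units

# Collect hidden activations together with the FSM state they should encode.
acts, labels = [], []
for _ in range(20):
    syms, states = sample_string()
    context = np.zeros(H)
    for s, st in zip(syms, states):
        context = np.tanh(one_hot(s, N_SYMS) @ Wxh + context @ Wch)
        acts.append(context.copy())
        labels.append(st)

# Hierarchical cluster analysis of the hidden unit encodings. With few hidden
# units the clusters blur across machine states; with more, a single state may
# split into clusters corresponding to different paths into it.
clusters = fcluster(linkage(np.array(acts), "ward"),
                    t=N_STATES, criterion="maxclust")
for c in range(1, N_STATES + 1):
    counts = np.bincount(np.array(labels)[clusters == c], minlength=N_STATES)
    print(f"cluster {c}: FSM-state counts {counts}")
```

Printing the cluster-versus-state counts makes the paper's observation checkable at a glance: clean one-to-one counts indicate the hidden units have recovered the machine states, while spread-out counts indicate fuzzy or path-split representations.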


Related articles

Recurrent Neural Network Learning for Text Routing

This paper describes new recurrent plausibility networks with internal recurrent hysteresis connections. These recurrent connections in multiple layers encode the sequential context of word sequences. We show how these networks can support text routing of noisy newswire titles according to different given categories. We demonstrate the potential of these networks using an 82 339 word corpus fro...


auDeep: Unsupervised Learning of Representations from Audio with Deep Recurrent Neural Networks

auDeep is a Python toolkit for deep unsupervised representation learning from acoustic data. It is based on a recurrent sequence to sequence autoencoder approach which can learn representations of time series data by taking into account their temporal dynamics. We provide an extensive command line interface in addition to a Python API for users and developers, both of which are comprehensively ...
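As a rough illustration of the recurrent sequence-to-sequence autoencoder idea this toolkit builds on, here is a minimal sketch in PyTorch. It is not auDeep's code or API; the class name, the teacher-forcing scheme, and all shapes and hyperparameters are assumptions.

```python
# Generic recurrent sequence-to-sequence autoencoder sketch -- NOT auDeep's
# actual implementation or API. Shapes and hyperparameters are illustrative.
import torch
import torch.nn as nn

class Seq2SeqAutoencoder(nn.Module):
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.encoder = nn.GRU(n_features, hidden, batch_first=True)
        self.decoder = nn.GRU(n_features, hidden, batch_first=True)
        self.out = nn.Linear(hidden, n_features)

    def forward(self, x):                    # x: (batch, time, features)
        _, h = self.encoder(x)               # h: (1, batch, hidden) = learned
        # Decode from the fixed-length representation; feeding the decoder the
        # right-shifted input is a common teacher-forcing choice.
        shifted = torch.cat([torch.zeros_like(x[:, :1]), x[:, :-1]], dim=1)
        y, _ = self.decoder(shifted, h)
        return self.out(y), h.squeeze(0)     # reconstruction + representation

model = Seq2SeqAutoencoder(n_features=40)    # e.g. 40 mel bands per frame
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
spectrograms = torch.randn(8, 100, 40)       # dummy batch: 8 clips, 100 frames

for step in range(10):                       # minimise reconstruction error;
    recon, rep = model(spectrograms)         # `rep` is the learned feature
    loss = nn.functional.mse_loss(recon, spectrograms)
    opt.zero_grad(); loss.backward(); opt.step()
```

Because training only requires reconstructing the input, the representation `rep` is learned without labels, which is the sense in which the approach is unsupervised.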


Semi-supervised Learning with Encoder-Decoder Recurrent Neural Networks: Experiments with Motion Capture Sequences

Recent work on sequence to sequence translation using Recurrent Neural Networks (RNNs) based on Long Short Term Memory (LSTM) architectures has shown great potential for learning useful representations of sequential data. A one-to-many encoder-decoder(s) scheme allows for a single encoder to provide representations serving multiple purposes. In our case, we present an LSTM encoder network able ...
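The one-to-many scheme mentioned above can be sketched in the same spirit: a single LSTM encoder whose final state conditions several decoders. This is not the paper's model; the two heads chosen here (reconstruction and forward prediction) and all dimensions are illustrative assumptions.

```python
# Sketch of a one-to-many encoder-decoder scheme -- not the paper's model.
# One shared LSTM encoder conditions two task-specific decoders.
import torch
import torch.nn as nn

class OneToMany(nn.Module):
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        # Two decoders share the single encoder representation.
        self.dec_recon = nn.LSTM(hidden, hidden, batch_first=True)
        self.dec_pred = nn.LSTM(hidden, hidden, batch_first=True)
        self.to_frames = nn.Linear(hidden, n_features)

    def forward(self, x, horizon=10):        # x: (batch, time, features)
        _, (h, c) = self.encoder(x)          # h: (1, batch, hidden)
        rep = h.transpose(0, 1)              # (batch, 1, hidden)
        rec_in = rep.repeat(1, x.size(1), 1) # repeat the code at each step
        fut_in = rep.repeat(1, horizon, 1)
        rec, _ = self.dec_recon(rec_in, (h, c))
        fut, _ = self.dec_pred(fut_in, (h, c))
        return self.to_frames(rec), self.to_frames(fut)

mocap = torch.randn(4, 50, 62)               # dummy batch, e.g. 62 channels
model = OneToMany(n_features=62)
recon, future = model(mocap)                 # reconstruction + prediction
```

The design point is that gradients from both decoders flow into the same encoder, so its representation must serve multiple purposes at once.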


Methods for Learning Articulated Attractors over Internal Representations

Recurrent attractor networks have many virtues which have prompted their use in a wide variety of connectionist cognitive models. One of these virtues is the ability of these networks to learn articulated attractors — meaningful basins of attraction arising from the systematic interaction of explicitly trained patterns. Such attractors can improve generalization by enforcing “well formedness” c...


A Connectionist Model Of Instruction Following

In this paper we describe a general connectionist model of “learning by being told”. Unlike common network models of inductive learning which rely on the slow modification of connection weights, our model of instructed learning focuses on rapid changes in the activation state of a recurrent network. We view stable distributed patterns of activation in such a network as internal representations ...




Publication date: 2007